
    How Many Pairwise Preferences Do We Need to Rank A Graph Consistently?

    We consider the problem of optimal recovery of the true ranking of $n$ items from a randomly chosen subset of their pairwise preferences. It is well known that, without any further assumption, one requires a sample size of $\Omega(n^2)$ for this purpose. We analyze the problem with the additional structure of a relational graph $G([n],E)$ over the $n$ items, together with an assumption of \emph{locality}: neighboring items are similar in their rankings. Noting the preferential nature of the data, we choose to embed not the graph itself, but its \emph{strong product}, to capture the pairwise node relationships. Furthermore, unlike the existing literature, which uses the Laplacian embedding for graph-based learning problems, we use a richer class of graph embeddings---\emph{orthonormal representations}---that includes the (normalized) Laplacian as a special case. Our proposed algorithm, {\it Pref-Rank}, predicts the underlying ranking using an SVM-based approach over the chosen embedding of the product graph, and is the first to provide \emph{statistical consistency} on two ranking losses, \emph{Kendall's tau} and \emph{Spearman's footrule}, with a required sample complexity of $O\big((n^2 \chi(\bar{G}))^{\frac{2}{3}}\big)$ pairs, where $\chi(\bar{G})$ is the \emph{chromatic number} of the complement graph $\bar{G}$. Our sample complexity is thus smaller for dense graphs, with $\chi(\bar G)$ characterizing the degree of node connectivity, which is intuitive given the locality assumption: e.g., $O(n^{\frac{4}{3}})$ for a union of $k$ cliques, or $O(n^{\frac{5}{3}})$ for random and power-law graphs---quantities much smaller than the fundamental limit of $\Omega(n^2)$ for large $n$. This, for the first time, relates ranking complexity to structural properties of the graph. We also report experimental evaluations on different synthetic and real datasets, where our algorithm is shown to outperform state-of-the-art methods.
    Comment: In the Thirty-Third AAAI Conference on Artificial Intelligence, 2019
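    The bound above is simple to evaluate directly. Below is a minimal sketch (not code from the paper) that plugs the abstract's example values of $\chi(\bar G)$ into $O\big((n^2 \chi(\bar G))^{2/3}\big)$, ignoring constants, to show how graph structure shrinks the required number of sampled pairs relative to the unstructured limit $\Omega(n^2)$. The specific $\chi(\bar G)$ values used are the ones quoted in the abstract, not derived here.

```python
# Minimal sketch (not from the paper): evaluating the Pref-Rank sample-complexity
# bound O((n^2 * chi(G_bar))^(2/3)) for the graph families quoted in the abstract.
# The chi(G_bar) values below are assumptions taken from those examples.

def prefrank_sample_bound(n: int, chi_complement: float) -> float:
    """Sample-complexity bound (n^2 * chi(G_bar))^(2/3), up to constants."""
    return (n ** 2 * chi_complement) ** (2.0 / 3.0)

n = 10_000
examples = {
    "union of k cliques (k = O(1), chi ~ k)": 5,           # scales as n^(4/3)
    "random / power-law graph (chi ~ sqrt(n))": n ** 0.5,  # scales as n^(5/3)
    "no structure (chi ~ n)": n,                            # recovers ~ n^2
}

for name, chi in examples.items():
    print(f"{name:45s}  bound ~ {prefrank_sample_bound(n, chi):.3e}")
print(f"{'fundamental limit Omega(n^2)':45s}  ~ {n ** 2:.3e}")
```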

    Integrated mmWave Access and Backhaul in 5G: Bandwidth Partitioning and Downlink Analysis

    With increasing network densification, it has become exceedingly difficult to provide traditional fiber backhaul to each cell site, which is especially true for small cell base stations (SBSs). The increasing maturity of millimeter wave (mmWave) communication has opened up the possibility of providing high-speed wireless backhaul to such cell sites. Since mmWave is also suitable for access links, the third generation partnership project (3GPP) is envisioning an integrated access and backhaul (IAB) architecture for fifth generation (5G) cellular networks, in which the same infrastructure and spectral resources are used for both access and backhaul. In this paper, we develop an analytical framework for an IAB-enabled cellular network, with which we provide an accurate characterization of its downlink rate coverage probability. Using this framework, we study the performance of two backhaul bandwidth (BW) partition strategies: (i) equal partition, in which all SBSs obtain an equal share of the backhaul BW, and (ii) load-based partition, in which the backhaul BW share of an SBS is proportional to its load. Our analysis shows that, depending on the choice of partition strategy, there exists an optimal split of access and backhaul BW for which the rate coverage is maximized. Further, there exists a critical volume of cell load (total number of users) beyond which the gains provided by the IAB-enabled network disappear and its performance converges to that of the traditional macro-only network with no SBSs.
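    As a rough illustration (not the paper's analytical framework), the sketch below implements the two bandwidth-sharing rules named in the abstract on a toy example. The function names, the total bandwidth, and the access/backhaul split fraction are assumptions introduced purely for illustration.

```python
# Minimal sketch (assumed names, not the paper's framework): the two backhaul
# bandwidth partition rules compared in the abstract. `sbs_loads` holds the
# number of users served by each small cell base station (SBS).

from typing import List

def equal_partition(backhaul_bw: float, sbs_loads: List[int]) -> List[float]:
    """(i) Equal partition: every SBS gets the same share of the backhaul BW."""
    n_sbs = len(sbs_loads)
    return [backhaul_bw / n_sbs] * n_sbs

def load_based_partition(backhaul_bw: float, sbs_loads: List[int]) -> List[float]:
    """(ii) Load-based partition: each SBS's share is proportional to its load."""
    total_load = sum(sbs_loads)
    return [backhaul_bw * load / total_load for load in sbs_loads]

# Toy example: total BW W with an assumed fraction eta reserved for backhaul.
W, eta = 400e6, 0.5        # 400 MHz carrier, half allocated to backhaul (assumed)
loads = [3, 8, 1, 12]      # users per SBS

print(equal_partition(eta * W, loads))       # identical shares
print(load_based_partition(eta * W, loads))  # shares proportional to load
```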